On implicit Lagrangian twin support vector regression by Newton method
Authors

Abstract

In this work, an implicit Lagrangian for the dual twin support vector regression is proposed. Our formulation leads to determining non-parallel ε-insensitive down- and up-bound functions for the unknown regressor by constructing two unconstrained quadratic programming problems of smaller size, instead of a single large one as in the standard support vector regression (SVR). The two related suppor...
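The two smaller problems can be illustrated with a deliberately simplified sketch: dropping the slack-penalty terms, each bound function reduces to a regularized least-squares solve over ε-shifted targets, and the final regressor is the mean of the two bounds. The function names and the simplification are ours, not the paper's exact formulation.

```python
import numpy as np

def tsvr_fit(A, Y, eps1=0.1, eps2=0.1, reg=1e-4):
    # Simplified sketch: keep only the least-squares core of each of the
    # two smaller TSVR problems (the hinge penalty on slacks is dropped),
    # so each bound function is a regularized normal-equation solve.
    e = np.ones((A.shape[0], 1))
    G = np.hstack([A, e])                      # augmented data [A e]
    M = G.T @ G + reg * np.eye(G.shape[1])     # small ridge term for stability
    u1 = np.linalg.solve(M, G.T @ (Y - eps1))  # down-bound parameters [w1; b1]
    u2 = np.linalg.solve(M, G.T @ (Y + eps2))  # up-bound parameters [w2; b2]
    return u1, u2

def tsvr_predict(A, u1, u2):
    # Final regressor: mean of the two non-parallel bound functions.
    G = np.hstack([A, np.ones((A.shape[0], 1))])
    return 0.5 * (G @ u1 + G @ u2)
```

With symmetric eps1 = eps2 the two ε-shifts cancel in the average, so on noiseless linear data the predictor recovers the underlying line almost exactly.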
Similar resources

Training Lagrangian twin support vector regression via unconstrained convex minimization
In this paper, a new unconstrained convex minimization problem formulation is proposed as the Lagrangian dual of the 2-norm twin support vector regression (TSVR). The proposed formulation leads to two smaller-sized unconstrained minimization problems whose objective functions are piecewise quadratic and differentiable. It is further proposed to apply a gradient-based iterative method for solv...
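An objective of the kind described here, piecewise quadratic and once differentiable, can be minimized with plain gradient iterations. A generic sketch on f(u) = ½‖u‖² + (C/2)‖(Gu − h)₊‖², where (·)₊ = max(·, 0); this particular objective and all names are illustrative, not the paper's exact formulation.

```python
import numpy as np

def plus(v):
    # Plus function (v)_+ = max(v, 0); squaring it keeps f differentiable.
    return np.maximum(v, 0.0)

def minimize_pwq(G, h, C=10.0, iters=500):
    # Gradient descent on f(u) = 0.5*||u||^2 + (C/2)*||(G u - h)_+||^2,
    # a piecewise-quadratic, once-differentiable function with gradient
    # u + C * G' (G u - h)_+.  Step 1/L with L bounding the gradient's
    # Lipschitz constant (spectral norm of G).
    L = 1.0 + C * np.linalg.norm(G, 2) ** 2
    u = np.zeros(G.shape[1])
    for _ in range(iters):
        grad = u + C * G.T @ plus(G @ u - h)
        u -= grad / L
    return u
```

Because the 0.5‖u‖² term makes f strongly convex, this fixed-step iteration converges linearly to the unique minimizer.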
Finite Newton method for Lagrangian support vector machine classification
An implicit Lagrangian [18] formulation of a support vector machine classifier that led to a highly effective iterative scheme [17] is solved here by a finite Newton method. The proposed method, which is extremely fast and terminates in 6 or 7 iterations, can handle classification problems in very high-dimensional spaces, e.g. over 28,000 dimensions, in a few seconds on a 400 MHz Pentium II machine. The m...
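The highly effective iterative scheme referenced here can be sketched in a few lines. This follows the well-known Lagrangian SVM fixed-point iteration u ← Q⁻¹(e + ((Qu − e) − αu)₊) with Q = I/ν + HHᵀ and H = D[A −e], which converges for 0 < α < 2/ν; variable names and the dense linear algebra are our simplifications.

```python
import numpy as np

def lsvm_train(A, d, nu=1.0, alpha=1.9, iters=500):
    # Lagrangian SVM sketch: solve the dual
    #   min 0.5 u'Qu - e'u,  u >= 0,  Q = I/nu + H H',  H = D [A -e],
    # by the fixed-point iteration
    #   u <- Qinv (e + ((Q u - e) - alpha*u)_+),
    # which converges linearly for 0 < alpha < 2/nu.
    m = A.shape[0]
    H = d[:, None] * np.hstack([A, -np.ones((m, 1))])  # D [A -e]
    Q = np.eye(m) / nu + H @ H.T
    Qinv = np.linalg.inv(Q)
    e = np.ones(m)
    u = Qinv @ e
    for _ in range(iters):
        u = Qinv @ (e + np.maximum((Q @ u - e) - alpha * u, 0.0))
    w = A.T @ (d * u)        # normal of the separating plane
    gamma = -np.sum(d * u)   # offset: classify by sign(x . w - gamma)
    return w, gamma
```

On well-separated data the resulting plane classifies the training points correctly after a modest number of iterations.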
Reduced twin support vector regression
We propose the reduced twin support vector regressor (RTSVR), which uses the notion of rectangular kernels to obtain significant improvements in execution time over the twin support vector regressor (TSVR), thus facilitating its application to larger datasets.
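The rectangular-kernel idea is to evaluate the kernel only against a small random subset Ā of the training rows, replacing the full m × m kernel matrix by an m × m̄ one. A sketch with a Gaussian kernel; the subset size and names are our choices:

```python
import numpy as np

def rectangular_kernel(A, Abar, sigma=1.0):
    # Reduced (rectangular) Gaussian kernel K(A, Abar) of shape (m, mbar)
    # instead of the full (m, m) square kernel, where Abar holds a small
    # random subset of the rows of A.
    sq = ((A[:, None, :] - Abar[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

# Typical usage: pick, say, 10% of the rows as the reduced set.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 3))
Abar = A[rng.choice(100, size=10, replace=False)]
K = rectangular_kernel(A, Abar)
```

All downstream linear algebra then works with the m × m̄ matrix, which is where the execution-time savings come from.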
A weighted twin support vector regression
Twin support vector regression (TSVR) is a new regression algorithm, which aims at finding ε-insensitive up- and down-bound functions for the training points. To do so, one needs to solve a pair of smaller-sized quadratic programming problems (QPPs) rather than a single large one as in classical SVR. However, the same penalties are given to the samples in TSVR. In fact, samples in the di...
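One natural way to realize per-sample penalties, sketched under our own simplifying assumptions (least-squares core only, a diagonal weight matrix W; not the paper's exact scheme), is to scale each sample's squared residual by its own weight in the two bound-fitting problems:

```python
import numpy as np

def weighted_bounds(A, Y, weights, eps=0.1, reg=1e-4):
    # Sample-weighted least-squares sketch: each squared residual is scaled
    # by its own weight (diagonal matrix W), so the down-/up-bound
    # parameters solve the normal equations  G'WG u = G'W (Y -/+ eps).
    e = np.ones((A.shape[0], 1))
    G = np.hstack([A, e])
    W = np.diag(weights)
    M = G.T @ W @ G + reg * np.eye(G.shape[1])
    u_down = np.linalg.solve(M, G.T @ W @ (Y - eps))  # down-bound [w1; b1]
    u_up = np.linalg.solve(M, G.T @ W @ (Y + eps))    # up-bound  [w2; b2]
    return u_down, u_up
```

With uniform weights this reduces to the unweighted case; larger weights pull the bound functions closer to the corresponding samples.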
Journal

Journal title: International Journal of Computational Intelligence Systems

Year: 2013
ISSN: 1875-6891, 1875-6883
DOI: 10.1080/18756891.2013.869900